Chi-square lower bounds

Author

  • Mark G. Low
Abstract

The information inequality has been shown to be an effective tool for providing lower bounds for the minimax risk. Bounds based on the chi-square distance can sometimes offer a considerable improvement, especially when applied iteratively. This paper compares these two methods in four examples, including the bounded normal mean problem as well as observations from a Poisson distribution.
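For rough orientation only (this is my own notation and not Low's construction, and it ignores the iterative refinement the paper is about), the sketch below contrasts the two quantities the abstract compares for a single observation from N(theta, 1): the information-inequality (Cramer-Rao) bound and a chi-square based (Chapman-Robbins type) bound on the variance of an unbiased estimator of the mean, using the closed form chi^2(N(theta + delta, 1), N(theta, 1)) = exp(delta^2) - 1.

    import numpy as np

    def chi_square_normal(delta, sigma=1.0):
        # chi^2( N(theta + delta, sigma^2) || N(theta, sigma^2) ) = exp(delta^2 / sigma^2) - 1
        return np.expm1((delta / sigma) ** 2)

    def chapman_robbins_bound(delta, sigma=1.0):
        # Chi-square (Chapman-Robbins) lower bound on the variance of an unbiased
        # estimator of the mean, for a single perturbation of size delta.
        return delta ** 2 / chi_square_normal(delta, sigma)

    cramer_rao = 1.0  # information-inequality bound for one N(theta, 1) observation

    for delta in (2.0, 1.0, 0.5, 0.1):
        print(f"delta = {delta:4.1f}   chi-square bound = {chapman_robbins_bound(delta):.4f}"
              f"   information bound = {cramer_rao:.4f}")

As delta shrinks, the chi-square bound recovers the information bound; the improvements discussed in the paper arise in constrained problems such as the bounded normal mean, where the two-point chi-square argument can be applied iteratively.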


Similar articles

Bounds for Stop-Loss Premium Under Restrictions on the Chi-Square Statistic

Stop-loss reinsurance is one type of reinsurance contract that has attracted recent attention. In the simplest form of this contract, a reinsurer agrees to pay all losses of the insurer in excess of an agreed limit. This paper concerns the computation of bounds on the stop-loss premium when the loss distribution is unknown, but information about past claim experience is available in the form of...
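For orientation (notation mine, not necessarily the paper's): with X the insurer's aggregate loss, F its distribution function and d the agreed limit (retention), the stop-loss premium being bounded is

    \pi(d) = \mathbb{E}\,[(X - d)_+] = \int_d^{\infty} \bigl(1 - F(x)\bigr)\,dx,

and the restrictions on the chi-square statistic in the title presumably constrain which F are consistent with the past claim data.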


Lower Bounds on Bayes Factors for Multinomial Distributions, with Application to Chi-squared Tests of Fit

Lower bounds on Bayes factors in favor of the null hypothesis in multinomial tests of point null hypotheses are developed. These are then applied to derive lower bounds on Bayes factors in both exact and asymptotic chi-squared testing situations. The general conclusion is that the lower bounds tend to be substantially larger than P-values, raising serious questions concerning the routine use of...


The supremum of Chi-Square processes

We describe a lower bound for the critical value of the supremum of a Chi-Square process. This bound can be approximated using an MCQMC simulation. We numerically compare this bound with the upper bound given by Davies, which is only suitable for a regular Chi-Square process. In a second part, we focus on a non-regular Chi-Square process: the Ornstein-Uhlenbeck Chi-Square process. Recently, Rabier et al. (2...


Bounds on Nonsymmetric Divergence Measure in terms of Other Symmetric and Nonsymmetric Divergence Measures

Vajda (1972) studied a generalized divergence measure of Csiszar's class, the so-called "Chi-m divergence measure." Variational distance and Chi-square divergence are special cases of this generalized divergence measure at m = 1 and m = 2, respectively. In this work, a nonparametric nonsymmetric measure of divergence, a particular case of Vajda's generalized divergence at m = 4, is taken and charac...
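One standard way to write the Chi-m (Vajda-type) divergence referred to above, assuming densities p and q with respect to a dominating measure mu (the paper's normalisation may differ), is

    \chi^m(P, Q) = \int \frac{|p(x) - q(x)|^m}{q(x)^{m-1}}\, d\mu(x), \qquad m \ge 1,

which reduces to the variational distance \int |p - q|\, d\mu at m = 1 and to the Pearson chi-square divergence \int (p - q)^2 / q \, d\mu at m = 2.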


On the small sample distributions of the empirical loglikelihood ratio

The author shows that the empirical loglikelihood ratio is also pivotal for the multivariate normal mean and derives E distributions, a family of distributions equivalent to Hotelling's T² in the context of empirical likelihood. For empirical likelihood ratio confidence regions based on estimating equations, the author discusses bounds on their coverage levels derived from the atoms of the E d...



Publication date: 2010